Conic reformulations for Kullback-Leibler divergence constrained distributionally robust optimization and applications
Authors
Abstract
In this paper, we consider a Kullback-Leibler (KL) divergence constrained distributionally robust optimization model. This model considers an ambiguity set that consists of all distributions whose KL divergence to the empirical distribution is bounded. Utilizing the fact that the KL divergence measure has an exponential cone representation, we obtain the robust counterpart of this problem as a dual exponential cone constrained program under mild assumptions on the underlying optimization problem. The resulting conic reformulation of the original problem can be directly solved by a commercial conic programming solver. We specialize our generic formulation to two classical problems, namely, the Newsvendor Problem and the Uncapacitated Facility Location Problem. Our computational study in an out-of-sample analysis shows that the solutions obtained via this approach yield significantly better performance in terms of the dispersion of cost realizations, while the central tendency deteriorates only slightly compared to stochastic programming.
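As an illustration (not taken from the paper itself), the classical dual of the KL-constrained worst-case expectation, sup over distributions Q with KL(Q‖P̂) ≤ ρ of E_Q[ℓ], equals inf over α > 0 of αρ + α log E_P̂[exp(ℓ/α)], a one-dimensional convex problem over the empirical sample. The sketch below evaluates it in plain Python with a golden-section search; the function name and search bounds are our own choices, not the paper's.

```python
import math

def worst_case_expectation(losses, rho, lo=1e-6, hi=1e6, iters=200):
    """Evaluate the KL-constrained worst-case expectation via its dual:
        inf_{a > 0}  a*rho + a*log( (1/n) * sum_i exp(loss_i / a) ).
    The dual objective is convex in a, so a golden-section search on
    [lo, hi] suffices for this illustration."""
    n = len(losses)
    m = max(losses)  # shift for a numerically stable log-sum-exp

    def dual(a):
        s = sum(math.exp((l - m) / a) for l in losses) / n
        return a * rho + m + a * math.log(s)

    phi = (math.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    for _ in range(iters):
        c, d = b - phi * (b - a), a + phi * (b - a)
        if dual(c) <= dual(d):
            b = d
        else:
            a = c
    return dual((a + b) / 2.0)
```

With ρ = 0 the value collapses to the empirical mean (stochastic programming), and as ρ grows it increases monotonically toward the worst observed loss, which matches the dispersion-versus-central-tendency trade-off described in the abstract. The conic reformulation in the paper embeds the same exponential terms as exponential cone constraints rather than solving this scalar dual directly.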
Similar resources
Kullback-Leibler Divergence Constrained Distributionally Robust Optimization
In this paper we study distributionally robust optimization (DRO) problems where the ambiguity set of the probability distribution is defined by the Kullback-Leibler (KL) divergence. We consider DRO problems where the ambiguity is in the objective function, which takes the form of an expectation, and show that the resulting minimax DRO problems can be formulated as a one-layer convex minimization ...
A Kullback-Leibler Divergence-based Distributionally Robust Optimization Model for Heat Pump Day-ahead Operational Schedule in Distribution Networks
For its high coefficient of performance and zero local emissions, the heat pump (HP) has recently become popular in North Europe and China. However, the integration of HPs may aggravate the daily peak-valley gap in distribution networks significantly.
Information Graphs for Epidemiological Applications of the Kullback-Leibler Divergence
Dear Editor, The topic addressed by this brief communication is the quantification of diagnostic information and, in particular, a method of illustrating graphically the quantification of diagnostic information for binary tests. To begin, consider the following outline procedure for development of such a test. An appropriate indicator variable that will serve as a proxy for the actual variable ...
Rényi Divergence and Kullback-Leibler Divergence
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibl...
Journal
Journal title: International Journal of Optimization and Control: Theories & Applications
Year: 2021
ISSN: 2146-5703, 2146-0957
DOI: https://doi.org/10.11121/ijocta.01.2021.001001